HHMM Parsing with Limited Parallelism
Authors
Abstract
Hierarchical Hidden Markov Model (HHMM) parsers have been proposed as psycholinguistic models due to their broad coverage within human-like working memory limits (Schuler et al., 2008) and ability to model human reading time behavior according to various complexity metrics (Wu et al., 2010). But HHMMs have been evaluated previously only with very wide beams of several thousand parallel hypotheses, weakening claims to the model’s efficiency and psychological relevance. This paper examines the effects of varying beam width on parsing accuracy and speed in this model, showing that parsing accuracy degrades gracefully as beam width decreases dramatically (to 2% of the width used to achieve previous top results), without sacrificing gains over a baseline CKY parser.
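The abstract's key variable is the beam width: the number of partial analyses the incremental parser keeps in parallel after each word. The sketch below is a minimal, hedged illustration of this kind of pruning in Python; the hypothesis objects and the `extend` transition function are hypothetical placeholders, not the HHMM parser's actual interface.

```python
# A minimal sketch of beam-limited incremental parsing: at each word, every
# surviving hypothesis is extended, and only the `beam_width` most probable
# analyses are kept in parallel.
import heapq

def parse_with_beam(words, initial_hypothesis, extend, beam_width):
    """extend(hypothesis, word) -> iterable of (new_hypothesis, log prob increment)."""
    beam = [(0.0, initial_hypothesis)]       # (cumulative log prob, hypothesis)
    for word in words:
        candidates = []
        for logp, hyp in beam:
            for new_hyp, delta in extend(hyp, word):
                candidates.append((logp + delta, new_hyp))
        # prune: keep only the most probable analyses
        beam = heapq.nlargest(beam_width, candidates, key=lambda c: c[0])
        if not beam:
            raise ValueError("all analyses pruned; the beam is too narrow for this input")
    return beam                              # surviving analyses of the full sentence
```

With a width of several thousand, such a beam behaves almost exhaustively; the paper's point is that shrinking the width to roughly 2% of that degrades accuracy only gracefully.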
Similar resources
Complexity Metrics in an Incremental Right-Corner Parser
Hierarchical HMM (HHMM) parsers make promising cognitive models: while they use a bounded model of working memory and pursue incremental hypotheses in parallel, they still achieve parsing accuracies competitive with chart-based techniques. This paper aims to validate that a right-corner HHMM parser is also able to produce complexity metrics, which quantify a reader’s incremental difficulty in u...
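The abstract is truncated, but complexity metrics of this kind are typically derived from the parser's prefix probabilities; surprisal (Hale, 2001) is the standard example and, purely as a point of reference, can be written as:

```latex
% Surprisal of word w_t given the preceding words, in terms of prefix probabilities:
\[
\mathrm{surprisal}(w_t)
  \;=\; -\log P\bigl(w_t \mid w_1 \dots w_{t-1}\bigr)
  \;=\; \log \frac{P(w_1 \dots w_{t-1})}{P(w_1 \dots w_t)}
\]
```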
A Syntactic Time-Series Model for Parsing Fluent and Disfluent Speech
This paper describes an incremental approach to parsing transcribed spontaneous speech containing disfluencies with a Hierarchical Hidden Markov Model (HHMM). This model makes use of the right-corner transform, which has been shown to increase non-incremental parsing accuracy on transcribed spontaneous speech (Miller and Schuler, 2008), using trees transformed in this manner to train the HHMM p...
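As a rough illustration of the right-corner transform mentioned here, the sketch below rotates the right spine of a binary-branching tree into a left-branching one with slash categories (A/B read as "an A lacking a B to the right"). It is a toy version only: the tree encoding and function name are assumptions, and the published transform (Miller and Schuler, 2008) also handles unary chains and other cases.

```python
# Toy right-corner rewrite for binarized trees.  A tree is (label, children),
# where children is a terminal string or a list of one or two subtrees.

def right_corner(tree):
    label, children = tree
    if isinstance(children, str):                      # preterminal: leave as is
        return tree
    if len(children) == 1:                             # unary: recurse only
        return (label, [right_corner(children[0])])
    # walk down the right spine, collecting left siblings and spine labels
    spine_labels, left_sibs, node = [label], [], tree
    while not isinstance(node[1], str) and len(node[1]) == 2:
        left, right = node[1]
        left_sibs.append(left)
        spine_labels.append(right[0])
        node = right
    # rebuild as a left-branching tree of slash categories
    result = (f"{label}/{spine_labels[1]}", [right_corner(left_sibs[0])])
    for i in range(1, len(left_sibs)):
        result = (f"{label}/{spine_labels[i + 1]}", [result, right_corner(left_sibs[i])])
    return (label, [result, right_corner(node)])

if __name__ == "__main__":
    # "the dog chased cats":  S -> NP VP,  NP -> DT NN,  VP -> VBD NNS
    tree = ("S", [("NP", [("DT", "the"), ("NN", "dog")]),
                  ("VP", [("VBD", "chased"), ("NNS", "cats")])])
    print(right_corner(tree))
    # -> ('S', [('S/NNS', [('S/VP', [('NP', [('NP/NN', [('DT', 'the')]),
    #                                        ('NN', 'dog')])]),
    #                      ('VBD', 'chased')]),
    #           ('NNS', 'cats')])
```

The rotated tree builds each constituent from left to right, which is what lets an incremental HHMM be trained on the transformed trees.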
An Empirical Evaluation of HHMM Parsing Time
Current state-of-the-art speech recognition systems use very little structural linguistic information during word recognition. Some systems attempt to apply syntactic and semantic analysis to speech, but this is typically done in a pipelined approach, with thresholding applied between each stage. It would be advantageous to make use of information about higher-level linguistic st...
A Probabilistic Corpus-based Model of Syntactic Parallelism
Work in experimental psycholinguistics has shown that the processing of coordinate structures is facilitated when the two conjuncts share the same syntactic structure (Frazier, Munn, & Clifton, 2000). In the present paper, we argue that this parallelism effect is a specific case of the more general phenomenon of syntactic priming—the tendency to repeat recently used syntactic structures. We sho...
Parsing in Parallel on Multiple Cores and GPUs
This paper examines the ways in which parallelism can be used to speed the parsing of dense PCFGs. We focus on two kinds of parallelism here: Symmetric Multi-Processing (SMP) parallelism on shared-memory multicore CPUs, and Single-Instruction Multiple-Thread (SIMT) parallelism on GPUs. We describe how to achieve speed-ups over an already very efficient baseline parser using both kinds of technol...
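A minimal sketch of why chart parsing admits this kind of parallelism: all CKY cells covering spans of the same width depend only on narrower spans, so each width can be filled concurrently. The Python below is an assumption-laden illustration, not the paper's implementation; the grammar encoding and function names are made up, and in CPython the GIL limits thread speedups, so real gains come from the multicore C and CUDA code the paper describes.

```python
# Diagonal-parallel CKY: cells of the same span width are mutually independent.
from concurrent.futures import ThreadPoolExecutor

def cky_by_diagonal(words, lexicon, binary_rules, workers=4):
    """lexicon:      word -> {preterminal: log prob}
       binary_rules: (B, C) -> {A: log prob} for rules A -> B C"""
    n = len(words)
    chart = {}                                        # (i, j) -> {label: best log prob}
    for i, w in enumerate(words):
        chart[(i, i + 1)] = dict(lexicon.get(w, {}))  # width-1 spans

    def fill(span):
        i, j = span
        cell = {}
        for k in range(i + 1, j):                     # every split point uses narrower spans,
            for B, pb in chart[(i, k)].items():       # which earlier widths already filled
                for C, pc in chart[(k, j)].items():
                    for A, pr in binary_rules.get((B, C), {}).items():
                        score = pb + pc + pr
                        if score > cell.get(A, float("-inf")):
                            cell[A] = score
        return span, cell

    with ThreadPoolExecutor(max_workers=workers) as pool:
        for width in range(2, n + 1):                 # widths filled in order...
            spans = [(i, i + width) for i in range(n - width + 1)]
            for span, cell in pool.map(fill, spans):  # ...cells within a width in parallel
                chart[span] = cell
    return chart                                      # chart[(0, n)] holds root analyses
```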
Publication date: 2010